In optimization, the line search strategy is one of two basic iterative approaches to find a local minimum <math>\mathbf{x}^*</math> of an objective function <math>f:\mathbb{R}^n\to\mathbb{R}</math>. The other approach is trust region.

The line search approach first finds a descent direction along which the objective function ''f'' will be reduced and then computes a step size that determines how far <math>\mathbf{x}</math> should move along that direction. The descent direction can be computed by various methods, such as gradient descent, Newton's method and the quasi-Newton method. The step size can be determined either exactly or inexactly.

== Example use ==
Here is an example gradient method that uses a line search in step 4.

# Set iteration counter <math>k=0</math> and make an initial guess <math>\mathbf{x}_0</math> for the minimum.
# Repeat:
# Compute a descent direction <math>\mathbf{p}_k</math>.
# Choose <math>\alpha_k</math> to ''loosely'' minimize <math>h(\alpha)=f(\mathbf{x}_k+\alpha\mathbf{p}_k)</math> over <math>\alpha\in\mathbb{R}_+</math>.
# Update <math>\mathbf{x}_{k+1}=\mathbf{x}_k+\alpha_k\mathbf{p}_k</math> and <math>k=k+1</math>.
# Until <math>\|\nabla f(\mathbf{x}_k)\|</math> < tolerance.

At the line search step (4) the algorithm might either ''exactly'' minimize ''h'', by solving <math>h'(\alpha_k)=0</math>, or ''loosely'', by asking for a sufficient decrease in ''h''. One example of the former is the conjugate gradient method. The latter is called inexact line search and may be performed in a number of ways, such as a backtracking line search or using the Wolfe conditions. Both cases are illustrated in the sketches below.

Like other optimization methods, line search may be combined with simulated annealing to allow it to jump over some local minima.
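To make the loop concrete, here is a minimal Python/NumPy sketch of the six steps above, using steepest descent (<math>\mathbf{p}_k=-\nabla f(\mathbf{x}_k)</math>) for step 3 and a backtracking line search enforcing the Armijo sufficient-decrease condition for step 4. The function names, parameter defaults and the Rosenbrock test problem are illustrative choices, not taken from the article or from any particular library.

<syntaxhighlight lang="python">
import numpy as np

def backtracking_line_search(f, xk, pk, gk, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until the Armijo (sufficient decrease) condition holds:
    f(xk + alpha*pk) <= f(xk) + c*alpha*(gk . pk), where gk . pk < 0
    for a descent direction."""
    fk = f(xk)
    slope = gk @ pk
    while f(xk + alpha * pk) > fk + c * alpha * slope and alpha > 1e-12:
        alpha *= rho  # backtrack: try a shorter step
    return alpha

def gradient_method(f, grad, x0, tol=1e-6, max_iter=100000):
    x = np.asarray(x0, dtype=float)       # step 1: k = 0, initial guess x_0
    for _ in range(max_iter):             # step 2: repeat
        g = grad(x)
        if np.linalg.norm(g) < tol:       # step 6: until ||grad f(x_k)|| < tolerance
            break
        p = -g                            # step 3: descent direction p_k
        a = backtracking_line_search(f, x, p, g)  # step 4: loosely minimize h
        x = x + a * p                     # step 5: x_{k+1} = x_k + alpha_k p_k
    return x

# Usage: Rosenbrock function, minimum at (1, 1). Steepest descent converges
# slowly on this problem, hence the generous iteration budget.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(gradient_method(f, grad, [-1.0, 1.0]))  # approximately [1. 1.]
</syntaxhighlight>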
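For the ''exact'' case, when ''f'' is a strictly convex quadratic <math>f(\mathbf{x})=\tfrac{1}{2}\mathbf{x}^{\mathsf{T}}A\mathbf{x}-\mathbf{b}^{\mathsf{T}}\mathbf{x}</math>, the equation <math>h'(\alpha_k)=0</math> has the closed-form solution <math>\alpha_k=\mathbf{p}_k^{\mathsf{T}}\mathbf{r}_k/(\mathbf{p}_k^{\mathsf{T}}A\mathbf{p}_k)</math>, where <math>\mathbf{r}_k=\mathbf{b}-A\mathbf{x}_k=-\nabla f(\mathbf{x}_k)</math>; the same step size appears inside the conjugate gradient method. The sketch below (the helper name and test data are illustrative assumptions) applies it with steepest-descent directions.

<syntaxhighlight lang="python">
import numpy as np

def exact_step(A, b, x, p):
    """Exact line search for f(x) = 0.5*x^T A x - b^T x along direction p:
    solving h'(alpha) = 0 gives alpha = p^T r / (p^T A p), with r = b - A x."""
    r = b - A @ x
    return (p @ r) / (p @ (A @ p))

# Steepest descent with exact line search on a small symmetric
# positive-definite problem; minimizing f is equivalent to solving A x = b.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = np.zeros(2)
for _ in range(100):
    r = b - A @ x                   # descent direction p_k = -grad f(x_k) = r_k
    if np.linalg.norm(r) < 1e-10:   # gradient small enough: done
        break
    x = x + exact_step(A, b, x, r) * r
print(x, np.linalg.solve(A, b))     # the two results should agree
</syntaxhighlight>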